ConvNTM: Conversational Neural Topic Model

Authors

Abstract

Topic models have been thoroughly investigated for multiple years due to their great potential in analyzing and understanding texts. Recently, researchers have combined the study of topic modeling with deep learning techniques, known as Neural Topic Models (NTMs). However, existing NTMs are mainly tested on general document modeling without considering different textual analysis scenarios. We assume that there are unique characteristics in specific textual analysis tasks that should be carefully considered when modeling topics. In this paper, we propose a Conversational Neural Topic Model (ConvNTM) designed in particular for the conversational scenario. Unlike general document topic modeling, a conversation session lasts for multiple turns: each short-text utterance complies with a single topic distribution, and these topic distributions are dependent across turns. Moreover, there are roles in conversations, a.k.a., speakers and addressees, and topic distributions are partially determined by such roles. We take these factors into account to model topics in conversations via a multi-turn and multi-role formulation. We also leverage the word co-occurrence relationship as a new training objective to further improve topic quality. Comprehensive experimental results on benchmark datasets demonstrate that our proposed ConvNTM achieves the best performance both in topic modeling and in typical downstream tasks within conversation research (i.e., dialogue act classification and dialogue response generation).
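
To make the multi-turn, multi-role formulation described in the abstract more concrete, below is a minimal PyTorch sketch. It is not the authors' implementation, and all names (ConvTopicSketch, bow_encoder, role_emb, to_topics, topic_word) are hypothetical. The sketch gives each utterance its own topic distribution, conditioned on a speaker/addressee role embedding and on the previous turn's topic mixture, and trains with a bag-of-words reconstruction loss; the paper's word co-occurrence objective is omitted for brevity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvTopicSketch(nn.Module):
    def __init__(self, vocab_size, num_topics=50, num_roles=2, hidden=256):
        super().__init__()
        self.bow_encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.role_emb = nn.Embedding(num_roles, hidden)
        # [utterance features ; role features ; previous turn's topic mixture] -> topic logits
        self.to_topics = nn.Linear(hidden + hidden + num_topics, num_topics)
        # Topic-word weights used to reconstruct each utterance's bag of words
        self.topic_word = nn.Linear(num_topics, vocab_size, bias=False)

    def forward(self, bows, roles):
        # bows:  (turns, vocab_size) bag-of-words counts, one row per utterance
        # roles: (turns,) integer role ids, e.g. 0 = speaker, 1 = addressee
        num_topics = self.topic_word.in_features
        prev_theta = bows.new_zeros(num_topics)      # no history before the first turn
        thetas, recon_loss = [], 0.0
        for t in range(bows.size(0)):
            h = self.bow_encoder(bows[t])            # short-text utterance features
            r = self.role_emb(roles[t])              # multi-role: speaker/addressee embedding
            theta = F.softmax(self.to_topics(torch.cat([h, r, prev_theta], dim=-1)), dim=-1)
            log_word_probs = F.log_softmax(self.topic_word(theta), dim=-1)
            recon_loss = recon_loss - (bows[t] * log_word_probs).sum()
            thetas.append(theta)
            prev_theta = theta                       # multi-turn: next turn depends on this one
        return torch.stack(thetas), recon_loss / bows.size(0)

# Toy usage: a 4-turn conversation over a 1,000-word vocabulary.
model = ConvTopicSketch(vocab_size=1000)
bows = torch.randint(0, 3, (4, 1000)).float()
roles = torch.tensor([0, 1, 0, 1])
thetas, loss = model(bows, roles)
loss.backward()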

Similar articles

A Neural Conversational Model

Conversational modeling is an important task in natural language understanding and machine intelligence. Although previous approaches exist, they are often restricted to specific domains (e.g., booking an airline ticket) and require handcrafted rules. In this paper, we present a simple approach for this task which uses the recently proposed sequence to sequence framework. Our model converses by...

Latent Topic Conversational Models

Despite much success in many large-scale language tasks, sequence-to-sequence (seq2seq) models have not been an ideal choice for conversational modeling as they tend to generate generic and repetitive responses. In this paper, we propose a Latent Topic Conversational Model (LTCM) that augments the seq2seq model with a neural topic component to better model human-human conversations. The neural ...

A Neural Autoregressive Topic Model

We describe a new model for learning meaningful representations of text documents from an unlabeled collection of documents. This model is inspired by the recently proposed Replicated Softmax, an undirected graphical model of word counts that was shown to learn a better generative model and more meaningful document representations. Specifically, we take inspiration from the conditional mean-fie...

Topic Compositional Neural Language Model

We propose a Topic Compositional Neural Language Model (TCNLM), a novel method designed to simultaneously capture both the global semantic meaning and the local word-ordering structure in a document. The TCNLM learns the global semantic coherence of a document via a neural topic model, and the probability of each learned latent topic is further used to build a Mixture-of-Experts (MoE) language mo...

A Hybrid Neural Network-Latent Topic Model

This paper introduces a hybrid model that combines a neural network with a latent topic model. The neural network provides a low-dimensional embedding for the input data, whose subsequent distribution is captured by the topic model. The neural network thus acts as a trainable feature extractor while the topic model captures the group structure of the data. Following an initial pretraining phase ...

Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i11.26595